Machine Learning Engineer Nanodegree

Capstone

Project: Corn Commodity Futures Price Predictor

In this project we will try to predict the weekly closing price of Corn commodity futures. To do this we will build a dataset that combines weekly Corn futures closing prices with the Long and Short Open Interest of Processors/Users (sometimes called Commercials) from the CFTC Commitment of Traders (COT) reports, and use that dataset to predict the following week's price.

1. Data Sets

  • Historical Futures Prices: Corn Futures, Continuous Contract #1. Non-adjusted price based on spot-month continuous contract calculations. Raw data from CME.
  • Commitment of Traders - CORN (CBT) - Futures Only (CFTC code 002602).

Data has been downloaded and stored in the .\data folder:

  • .\data\CHRIS-CME_C1.csv - Corn Futures prices data
  • .\data\CFTC-002602_F_ALL.csv - Commitment of Traders data
In [1]:
import warnings
warnings.filterwarnings('ignore')
In [2]:
import pandas as pd
import numpy as np
from IPython.core.display import display, HTML
pd.options.display.max_colwidth = 500  # otherwise pandas truncates
# displayed HTML strings to 50 characters
pd.set_option('display.max_columns', None)
pd.set_option('display.max_rows', None)
pd.options.mode.chained_assignment = None  # default='warn'
from matplotlib import pyplot
from sklearn.preprocessing import MinMaxScaler
from keras.models import Sequential
from keras.layers import Dense
from keras.layers import LSTM
from math import sqrt
from numpy import concatenate
from sklearn.metrics import mean_squared_error
import matplotlib.pyplot as plt
from plotly.offline import download_plotlyjs, init_notebook_mode, plot, iplot
import cufflinks as cf
import plotly.tools as tls
init_notebook_mode(connected=True)
cf.go_offline()
Using TensorFlow backend.

2. Prepare and Explore Data

In [3]:
df_fut_orig = pd.read_csv('data/CHRIS-CME_C1.csv')  # forward slashes keep the path portable
df_fut_orig.head(n=5)
Out[3]:
Date Open High Low Last Change Settle Volume Previous_Day_Open_Interest
0 2018-07-10 344.25 344.75 336.25 339.50 6.00 339.75 2668.0 2186.0
1 2018-07-09 346.00 348.50 342.50 346.00 6.00 345.75 3190.0 2969.0
2 2018-07-06 342.00 352.25 342.00 350.75 8.25 351.75 3068.0 3959.0
3 2018-07-05 345.50 348.75 341.50 342.50 0.75 343.50 3302.0 4812.0
4 2018-07-03 340.25 345.25 339.25 343.25 5.25 342.75 3048.0 5687.0
In [4]:
# Display a description of the dataset
display(df_fut_orig.describe())
Open High Low Last Change Settle Volume Previous_Day_Open_Interest
count 3033.000000 3034.000000 3034.000000 3034.000000 1081.000000 3034.000000 3034.000000 3034.00000
mean 457.095038 462.322924 451.795485 456.920040 3.950324 456.979318 103905.200396 352140.90145
std 140.338892 142.056030 138.436196 140.243019 3.415126 140.204571 73993.219920 248565.85531
min 219.000000 220.750000 216.750000 219.000000 0.000000 219.000000 0.000000 107.00000
25% 360.000000 363.000000 356.250000 359.500000 1.500000 359.750000 40172.750000 107559.25000
50% 388.500000 392.000000 383.500000 388.750000 3.000000 389.000000 102567.000000 365073.00000
75% 565.500000 573.562500 557.375000 564.625000 5.500000 564.625000 152391.250000 556408.50000
max 830.250000 843.750000 822.750000 831.250000 30.750000 831.250000 538170.000000 858696.00000
In [5]:
df_fut_orig['Date'] = pd.to_datetime(df_fut_orig['Date'])
df_fut_orig.set_index('Date',inplace=True)
df_fut_orig = df_fut_orig.sort_values('Date')

Plot Corn Futures Price Series using Plotly

In [6]:
%load_ext autoreload
%autoreload 2
import visuals

visuals.plot_original_price_series(df_fut_orig)

It seems there are some rows where Volume = 0; let's find out more about these rows.

In [7]:
df_fut_orig[df_fut_orig['Volume']<1]
Out[7]:
Open High Low Last Change Settle Volume Previous_Day_Open_Interest
Date
2007-04-05 359.75 367.50 357.25 366.00 NaN 366.00 0.0 354349.0
2012-04-06 658.25 658.25 658.25 658.25 NaN 658.25 0.0 401521.0
2015-04-03 386.50 386.50 386.50 386.50 NaN 386.50 0.0 470964.0

Since we will resample the daily prices into weekly prices, let's drop those rows.

In [8]:
# drop the zero-volume rows identified above
df_fut_orig.drop(df_fut_orig[df_fut_orig.Volume < 1].index, inplace=True)
In [9]:
df_cot_orig = pd.read_csv('data/CFTC-002602_F_ALL.csv')
display(df_cot_orig.head())
Date Open_Interest Producer_Merchant_Processor_User_Longs Producer_Merchant_Processor_User_Shorts Swap Dealer Longs Swap Dealer Shorts Swap Dealer Spreads Money Manager Longs Money Manager Shorts Money Manager Spreads Other Reportable Longs Other Reportable Shorts Other Reportable Spreads Total Reportable Longs Total Reportable Shorts Non Reportable Longs Non Reportable Shorts
0 2018-07-10 1818055.0 500172.0 750062.0 208128.0 39513.0 99477.0 263353.0 404297.0 154286.0 320946.0 70682.0 98709.0 1645071.0 1617026.0 172984.0 201029.0
1 2018-07-03 1830330.0 484257.0 773851.0 210341.0 36927.0 100340.0 274795.0 382191.0 149756.0 322256.0 66508.0 119627.0 1661372.0 1629200.0 168958.0 201130.0
2 2018-06-26 1885804.0 513100.0 840177.0 223131.0 32763.0 91972.0 287061.0 377825.0 153461.0 330396.0 58283.0 116745.0 1715866.0 1671226.0 169938.0 214578.0
3 2018-06-19 1992169.0 525197.0 920764.0 222105.0 41144.0 99285.0 299377.0 356828.0 163454.0 379025.0 56652.0 135078.0 1823521.0 1773205.0 168648.0 218964.0
4 2018-06-12 1963233.0 488666.0 917204.0 235249.0 37674.0 93281.0 292054.0 304292.0 172623.0 363918.0 65030.0 147098.0 1792889.0 1737202.0 170344.0 226031.0
In [10]:
display(df_cot_orig.describe())
Open_Interest Producer_Merchant_Processor_User_Longs Producer_Merchant_Processor_User_Shorts Swap Dealer Longs Swap Dealer Shorts Swap Dealer Spreads Money Manager Longs Money Manager Shorts Money Manager Spreads Other Reportable Longs Other Reportable Shorts Other Reportable Spreads Total Reportable Longs Total Reportable Shorts Non Reportable Longs Non Reportable Shorts
count 6.310000e+02 631.000000 6.310000e+02 631.000000 631.000000 631.000000 631.000000 631.000000 631.000000 631.000000 631.000000 631.000000 6.310000e+02 6.310000e+02 631.000000 631.000000
mean 1.292201e+06 270795.049128 6.268425e+05 290792.497623 20337.034865 33260.068146 236884.269414 137472.426307 94546.356577 140931.890650 70914.334390 85505.109350 1.152715e+06 1.068878e+06 139485.541997 223322.976228
std 2.095471e+05 68976.221600 1.554272e+05 53203.484072 18944.008732 22912.567257 67454.195123 109465.025186 32739.133163 51939.690903 26360.863384 29682.425476 1.939790e+05 2.060080e+05 23718.957966 29824.710288
min 7.482520e+05 102373.000000 2.972960e+05 186981.000000 0.000000 4397.000000 96989.000000 6714.000000 29130.000000 49809.000000 25905.000000 27592.000000 6.379810e+05 5.689510e+05 78578.000000 156086.000000
25% 1.192226e+06 226595.000000 5.235930e+05 255196.500000 6524.000000 13978.000000 186366.500000 47947.000000 72018.500000 104764.000000 53331.000000 62690.000000 1.055362e+06 9.573815e+05 121829.500000 198860.500000
50% 1.301506e+06 262823.000000 6.112810e+05 276337.000000 15239.000000 27209.000000 225682.000000 95548.000000 91850.000000 140343.000000 66261.000000 82705.000000 1.166372e+06 1.067548e+06 136966.000000 227337.000000
75% 1.398275e+06 314224.000000 7.058555e+05 321265.500000 28178.000000 48009.500000 287331.000000 211154.000000 113803.000000 175846.000000 83448.500000 106077.500000 1.247976e+06 1.180280e+06 153542.500000 246903.000000
max 1.992169e+06 525197.000000 1.001517e+06 422803.000000 95591.000000 113775.000000 431569.000000 447470.000000 231064.000000 379025.000000 173322.000000 181385.000000 1.825238e+06 1.773205e+06 206821.000000 293948.000000

Drop unnecessary columns and resample the data

In [11]:
# keep only the Settle, Volume and Previous_Day_Open_Interest columns
df_fut = df_fut_orig[['Settle', 'Volume', 'Previous_Day_Open_Interest']].copy()
display(df_fut.head())
Settle Volume Previous_Day_Open_Interest
Date
2006-06-16 235.50 56486.0 203491.0
2006-06-19 229.75 51299.0 190044.0
2006-06-20 229.75 41605.0 175859.0
2006-06-21 232.75 29803.0 162348.0
2006-06-22 230.50 28687.0 147658.0
In [12]:
# take the last daily observation in each calendar week
s_settle = df_fut['Settle'].resample('W').last()
s_volume = df_fut['Volume'].resample('W').last()
df_fut_weekly = pd.concat([s_settle, s_volume], axis=1)
display(df_fut_weekly.head())
Settle Volume
Date
2006-06-18 235.50 56486.0
2006-06-25 228.25 28361.0
2006-07-02 235.50 30519.0
2006-07-09 241.00 13057.0
2006-07-16 253.50 2460.0
In [13]:
# keep only Date, Open_Interest and the Producer/Merchant/Processor/User positions
df_cot = df_cot_orig[['Date', 'Open_Interest',
                      'Producer_Merchant_Processor_User_Longs',
                      'Producer_Merchant_Processor_User_Shorts']].copy()
df_cot.rename(columns={"Producer_Merchant_Processor_User_Longs": "Longs",
                       "Producer_Merchant_Processor_User_Shorts": "Shorts"}, inplace=True)
df_cot['Date'] = pd.to_datetime(df_cot['Date'])
df_cot.set_index('Date',inplace=True)
display(df_cot.head())
Open_Interest Longs Shorts
Date
2018-07-10 1818055.0 500172.0 750062.0
2018-07-03 1830330.0 484257.0 773851.0
2018-06-26 1885804.0 513100.0 840177.0
2018-06-19 1992169.0 525197.0 920764.0
2018-06-12 1963233.0 488666.0 917204.0
In [14]:
# align the COT figures on the same weekly grid as the futures prices
s_longs = df_cot['Longs'].resample('W').last()
s_shorts = df_cot['Shorts'].resample('W').last()
s_open_interest = df_cot['Open_Interest'].resample('W').last()
df_cot_weekly = pd.concat([s_open_interest, s_longs, s_shorts], axis=1)
display(df_cot_weekly.head(5))
Open_Interest Longs Shorts
Date
2006-06-18 1320155.0 209662.0 699163.0
2006-06-25 1321520.0 224476.0 666688.0
2006-07-02 1329400.0 234769.0 645735.0
2006-07-09 1327482.0 220552.0 648405.0
2006-07-16 1333225.0 216968.0 673110.0
In [15]:
df_weekly = pd.merge(df_fut_weekly,df_cot_weekly, on='Date')
display(df_weekly.head(5))
Settle Volume Open_Interest Longs Shorts
Date
2006-06-18 235.50 56486.0 1320155.0 209662.0 699163.0
2006-06-25 228.25 28361.0 1321520.0 224476.0 666688.0
2006-07-02 235.50 30519.0 1329400.0 234769.0 645735.0
2006-07-09 241.00 13057.0 1327482.0 220552.0 648405.0
2006-07-16 253.50 2460.0 1333225.0 216968.0 673110.0
In [16]:
# Display a description of the dataset
display(df_weekly.describe())
Settle Volume Open_Interest Longs Shorts
count 631.000000 631.000000 6.310000e+02 631.000000 6.310000e+02
mean 456.978605 100835.204437 1.292201e+06 270795.049128 6.268425e+05
std 140.242112 72466.341538 2.095471e+05 68976.221600 1.554272e+05
min 219.750000 132.000000 7.482520e+05 102373.000000 2.972960e+05
25% 359.500000 34822.500000 1.192226e+06 226595.000000 5.235930e+05
50% 389.250000 101209.000000 1.301506e+06 262823.000000 6.112810e+05
75% 560.375000 150341.000000 1.398275e+06 314224.000000 7.058555e+05
max 824.500000 369522.000000 1.992169e+06 525197.000000 1.001517e+06
In [17]:
# reset index since we need row numbers for splitting
df_weekly_idx_date=df_weekly.copy()
df_weekly.reset_index(inplace=True)

3. Visualise Data

In [18]:
%load_ext autoreload
%autoreload 2
import visuals

visuals.plot_weekly_combined_series_by_date(df_weekly)
In [19]:
%load_ext autoreload
%autoreload 2
import visuals

visuals.plot_weekly_combined_series_by_trading_week(df_weekly)
In [20]:
%load_ext autoreload
%autoreload 2
import visuals
visuals.plot_grouped_by_year_data(df_weekly_idx_date,"Stacked Plots of Price by Year")
In [21]:
%load_ext autoreload
%autoreload 2
import visuals
visuals.lag_plot(df_weekly,"Lag Plot")

4. Normalise the Data Using MinMaxScaler

In [22]:
scaler = MinMaxScaler(feature_range=(0, 1))
values = df_weekly.loc[:, df_weekly.columns != 'Date'].values
scaled = scaler.fit_transform(values)

5. Split data into training, validation and test sets

In [23]:
validation_start=df_weekly[df_weekly['Date'] >= pd.to_datetime('2017-01-01')].index[0]
testing_start=df_weekly[df_weekly['Date'] >= pd.to_datetime('2018-01-01')].index[0]
In [24]:
print("validation start",validation_start)
print("testing start",testing_start)
validation start 550
testing start 603
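
Because this is a time series, the split is chronological rather than random: rows before 2017 are used for training, the weeks of 2017 for validation, and 2018 onwards for testing, so the model is always evaluated on data that comes after anything it was trained on.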
In [25]:
# print data to double check
#print(df_weekly.iloc[validation_start])
#print(df_weekly.iloc[testing_start])
In [26]:
%load_ext autoreload
%autoreload 2
import data_preparer
reframed = data_preparer.series_to_supervised(scaled, 1, 1)
In [27]:
# drop the time-t columns of the other features so that only var1(t),
# the scaled Settle price we want to predict, remains as the target
reframed.drop(reframed.columns[[6,7,8,9]], axis=1, inplace=True)
In [28]:
display(reframed.head())
var1(t-1) var2(t-1) var3(t-1) var4(t-1) var5(t-1) var1(t)
1 0.026044 0.152560 0.459760 0.253744 0.570655 0.014055
2 0.014055 0.076421 0.460857 0.288780 0.524540 0.026044
3 0.026044 0.082263 0.467192 0.313123 0.494786 0.035138
4 0.035138 0.034990 0.465650 0.279499 0.498578 0.055808
5 0.055808 0.006302 0.470267 0.271023 0.533659 0.028938
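
These column names are produced by data_preparer.series_to_supervised, which frames the scaled multivariate series as a supervised-learning table by shifting it against itself: the five var*(t-1) columns are last week's scaled Settle, Volume, Open_Interest, Longs and Shorts, and var1(t) is the current week's scaled Settle. The actual implementation lives in data_preparer.py; a minimal sketch of this common sliding-window pattern (an assumed reconstruction, not necessarily the exact code) looks like this:

from pandas import DataFrame, concat

def series_to_supervised(data, n_in=1, n_out=1, dropnan=True):
    """Frame a (scaled) multivariate series as a supervised-learning table."""
    n_vars = data.shape[1]
    df = DataFrame(data)
    cols, names = [], []
    # input columns: var_j(t-n_in) ... var_j(t-1)
    for i in range(n_in, 0, -1):
        cols.append(df.shift(i))
        names += ['var%d(t-%d)' % (j + 1, i) for j in range(n_vars)]
    # output columns: var_j(t) ... var_j(t+n_out-1)
    for i in range(n_out):
        cols.append(df.shift(-i))
        names += ['var%d(t)' % (j + 1) if i == 0 else
                  'var%d(t+%d)' % (j + 1, i) for j in range(n_vars)]
    agg = concat(cols, axis=1)
    agg.columns = names
    if dropnan:
        agg.dropna(inplace=True)  # the first n_in rows contain NaNs from the shift
    return agg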

6. Define and Fit Model

In [29]:
%load_ext autoreload
%autoreload 2
import data_preparer
train_X, train_y, validation_X, validation_y,test_X, test_y = data_preparer.split_data(reframed,validation_start,testing_start)
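
split_data is defined in data_preparer.py; what matters here is that it cuts the supervised frame at the two row indices computed above and reshapes the inputs into the 3D [samples, timesteps, features] layout Keras LSTMs expect. A rough sketch, under the assumption of one timestep per sample (this is a reconstruction, not the confirmed code):

import numpy as np

def split_data(reframed, validation_start, testing_start):
    values = reframed.values
    splits = (values[:validation_start],
              values[validation_start:testing_start],
              values[testing_start:])
    out = []
    for part in splits:
        X, y = part[:, :-1], part[:, -1]                     # last column is var1(t)
        out += [X.reshape((X.shape[0], 1, X.shape[1])), y]   # [samples, 1, features]
    return out  # train_X, train_y, validation_X, validation_y, test_X, test_y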
In [54]:
%load_ext autoreload
%autoreload 2
import models
model,history=models.basic_lstm_model(train_X,train_y,validation_X,validation_y)
Train on 550 samples, validate on 53 samples
Epoch 1/500
 - 14s - loss: 0.4491 - val_loss: 0.2591
Epoch 2/500
 - 0s - loss: 0.4371 - val_loss: 0.2465
Epoch 3/500
 - 0s - loss: 0.4250 - val_loss: 0.2341
 ... (epochs 4-498 omitted; training loss falls steadily from 0.41 to 0.028 and validation loss from 0.22 to 0.013) ...
Epoch 499/500
 - 0s - loss: 0.0278 - val_loss: 0.0134
Epoch 500/500
 - 0s - loss: 0.0278 - val_loss: 0.0134
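
The architecture itself is defined in models.py. A minimal sketch of what basic_lstm_model may look like (the layer size, loss function and optimizer here are assumptions, not the confirmed settings); note that it returns history.history, which the plotting cell below indexes as a plain dictionary:

from keras.models import Sequential
from keras.layers import Dense, LSTM

def basic_lstm_model(train_X, train_y, validation_X, validation_y):
    # one LSTM layer feeding a single linear output unit (assumed sizes)
    model = Sequential()
    model.add(LSTM(50, input_shape=(train_X.shape[1], train_X.shape[2])))
    model.add(Dense(1))
    model.compile(loss='mae', optimizer='adam')
    history = model.fit(train_X, train_y, epochs=500,
                        validation_data=(validation_X, validation_y),
                        verbose=2, shuffle=False)
    return model, history.history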
In [55]:
pyplot.plot(history['loss'], label='train')
pyplot.plot(history['val_loss'], label='validation')
pyplot.legend()
pyplot.show()
In [56]:
# make a prediction
%load_ext autoreload
%autoreload 2
import models
inv_yhat, inv_y, rmse=models.make_lstm_prediction(validation_X,validation_y,model,scaler)
print('LSTM Model on Validation Data RMSE: %.3f' % rmse)
LSTM Model on Validation Data RMSE: 9.969
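
make_lstm_prediction has to undo the MinMax scaling before the RMSE is meaningful in price units. The notebook's header already imports concatenate, sqrt and mean_squared_error, which suggests the standard invert-then-score pattern below; treat it as an assumed sketch of what lives in models.py, relying on Settle being the first scaled column:

from math import sqrt
from numpy import concatenate
from sklearn.metrics import mean_squared_error

def make_lstm_prediction(X, y, model, scaler):
    yhat = model.predict(X)
    X_flat = X.reshape((X.shape[0], X.shape[2]))
    # rebuild full-width rows so the scaler can invert them, then keep column 0 (Settle)
    inv_yhat = scaler.inverse_transform(concatenate((yhat, X_flat[:, 1:]), axis=1))[:, 0]
    inv_y = scaler.inverse_transform(concatenate((y.reshape(-1, 1), X_flat[:, 1:]), axis=1))[:, 0]
    rmse = sqrt(mean_squared_error(inv_y, inv_yhat))
    return inv_yhat, inv_y, rmse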
In [57]:
%load_ext autoreload
%autoreload 2
import visuals

visuals.plot_series_to_compare(inv_y,inv_yhat,"Actual Price","Predicted Price", "Actual Price Versus LSTM Predicted Price")

7. Benchmark Model

In this section we will evaluate our benchmark model. As proposed in the project proposal, the benchmark model is a simple linear regression model.

Load the preprocessed data

In [34]:
from pandas import concat
from matplotlib import pyplot
from sklearn.metrics import mean_squared_error
from math import sqrt


# Create the lagged dataset: column 't' holds last week's Settle price and
# column 't+1' holds the current week's price, which we want to predict
values = pd.DataFrame(df_weekly["Settle"].values)
df_benchmark = concat([values.shift(1), values], axis=1)
df_benchmark.columns = ['t', 't+1']
display(df_benchmark.head(5))
t t+1
0 NaN 235.50
1 235.50 228.25
2 228.25 235.50
3 235.50 241.00
4 241.00 253.50
In [35]:
# split into train , validation and test sets
X = df_benchmark.values
train, validation, test = X[1:validation_start], X[validation_start:testing_start],X[testing_start:]
train_bench_X, train_bench_y = train[:,0], train[:,1]
validation_bench_X, validation_bench_y = validation[:,0], validation[:,1]
test_bench_X, test_bench_y = test[:,0], test[:,1]
In [36]:
%load_ext autoreload
%autoreload 2
import models
In [37]:
# make a prediction
%load_ext autoreload
%autoreload 2
import models
predictions,rmse=models.make_benchmark_model_prediction(validation_bench_X,validation_bench_y)
print('Benchmark Model on Validation Data RMSE: %.3f' % rmse)
Benchmark Model on Validation Data RMSE: 8.750
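
The benchmark logic also lives in models.py; only its signature and the (predictions, rmse) pair it returns are visible from this notebook. A speculative sketch consistent with the "simple linear regressor" description (the actual function may instead fit on the training split):

from math import sqrt
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

def make_benchmark_model_prediction(X, y):
    # regress this week's settle on last week's settle (assumed behaviour)
    lr = LinearRegression()
    lr.fit(X.reshape(-1, 1), y)
    predictions = lr.predict(X.reshape(-1, 1))
    rmse = sqrt(mean_squared_error(y, predictions))
    return predictions, rmse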
In [38]:
%load_ext autoreload
%autoreload 2
import visuals

visuals.plot_series_to_compare(validation_bench_y,predictions,"Actual Price","Predicted Price", "Actual Price Versus Benchmark Model Predicted Price")

8. Test model on unseen data

Test LSTM model on unseen data

In [58]:
# make a prediction
%load_ext autoreload
%autoreload 2
import models
inv_yhat, inv_y, rmse=models.make_lstm_prediction(test_X,test_y,model,scaler)
print('LSTM Model on Test Data RMSE: %.3f' % rmse)
LSTM Model on Test Data RMSE: 10.731

Test Benchmark model on unseen data

In [40]:
# make a prediction
%load_ext autoreload
%autoreload 2
import models
predictions,rmse=models.make_benchmark_model_prediction(test_bench_X,test_bench_y)
print('Benchmark Model on Test Data RMSE: %.3f' % rmse)
Benchmark Model on Test Data RMSE: 8.293

9. Tune the Basic LSTM Model

In [41]:
%load_ext autoreload
%autoreload 2
import tune_model
tune_model.tune_memmory_cells(train_X,train_y,validation_X,validation_y)
>1/5 param=1.000000, loss=0.011358
>2/5 param=1.000000, loss=0.011374
>3/5 param=1.000000, loss=0.011236
>4/5 param=1.000000, loss=0.010630
>5/5 param=1.000000, loss=0.012073
>1/5 param=5.000000, loss=0.011070
>2/5 param=5.000000, loss=0.011885
>3/5 param=5.000000, loss=0.011132
>4/5 param=5.000000, loss=0.011481
>5/5 param=5.000000, loss=0.012153
>1/5 param=10.000000, loss=0.012393
>2/5 param=10.000000, loss=0.012479
>3/5 param=10.000000, loss=0.011457
>4/5 param=10.000000, loss=0.011210
>5/5 param=10.000000, loss=0.013048
>1/5 param=25.000000, loss=0.011630
>2/5 param=25.000000, loss=0.011377
>3/5 param=25.000000, loss=0.012358
>4/5 param=25.000000, loss=0.011886
>5/5 param=25.000000, loss=0.011421
>1/5 param=50.000000, loss=0.012121
>2/5 param=50.000000, loss=0.013164
>3/5 param=50.000000, loss=0.012114
>4/5 param=50.000000, loss=0.011270
>5/5 param=50.000000, loss=0.012579
>1/5 param=100.000000, loss=0.012769
>2/5 param=100.000000, loss=0.011105
>3/5 param=100.000000, loss=0.013082
>4/5 param=100.000000, loss=0.013613
>5/5 param=100.000000, loss=0.011828
>1/5 param=200.000000, loss=0.012533
>2/5 param=200.000000, loss=0.012772
>3/5 param=200.000000, loss=0.014684
>4/5 param=200.000000, loss=0.012228
>5/5 param=200.000000, loss=0.015286
              1         5        10        25        50       100       200
count  5.000000  5.000000  5.000000  5.000000  5.000000  5.000000  5.000000
mean   0.011334  0.011544  0.012117  0.011735  0.012250  0.012479  0.013501
std    0.000513  0.000470  0.000764  0.000403  0.000696  0.001006  0.001385
min    0.010630  0.011070  0.011210  0.011377  0.011270  0.011105  0.012228
25%    0.011236  0.011132  0.011457  0.011421  0.012114  0.011828  0.012533
50%    0.011358  0.011481  0.012393  0.011630  0.012121  0.012769  0.012772
75%    0.011374  0.011885  0.012479  0.011886  0.012579  0.013082  0.014684
max    0.012073  0.012153  0.013048  0.012358  0.013164  0.013613  0.015286
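
All four tuning cells in this section share the same harness: each candidate value is trained several times (5 here) to average out random initialisation, the final validation losses are collected, and pandas' describe() produces the summary tables shown. A sketch of that pattern (the real code is in tune_model.py; the build_fn parameter is a hypothetical stand-in for whichever hyperparameter is being varied):

import pandas as pd

def tune_parameter(values, build_fn, train_X, train_y, val_X, val_y, n_repeats=5):
    results = {}
    for v in values:
        losses = []
        for r in range(n_repeats):
            model, history = build_fn(v, train_X, train_y, val_X, val_y)
            loss = history['val_loss'][-1]  # final-epoch validation loss
            print('>%d/%d param=%f, loss=%f' % (r + 1, n_repeats, v, loss))
            losses.append(loss)
        results[v] = losses
    print(pd.DataFrame(results).describe())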
In [42]:
%load_ext autoreload
%autoreload 2
import tune_model
tune_model.tune_batch_size(train_X,train_y,validation_X,validation_y)
>1/5 param=2.000000, loss=0.017831
>2/5 param=2.000000, loss=0.018300
>3/5 param=2.000000, loss=0.016560
>4/5 param=2.000000, loss=0.015809
>5/5 param=2.000000, loss=0.018931
>1/5 param=4.000000, loss=0.011840
>2/5 param=4.000000, loss=0.013485
>3/5 param=4.000000, loss=0.012221
>4/5 param=4.000000, loss=0.011967
>5/5 param=4.000000, loss=0.011737
>1/5 param=8.000000, loss=0.016118
>2/5 param=8.000000, loss=0.015007
>3/5 param=8.000000, loss=0.015883
>4/5 param=8.000000, loss=0.015742
>5/5 param=8.000000, loss=0.015618
>1/5 param=32.000000, loss=0.011333
>2/5 param=32.000000, loss=0.011286
>3/5 param=32.000000, loss=0.013534
>4/5 param=32.000000, loss=0.011341
>5/5 param=32.000000, loss=0.011329
>1/5 param=64.000000, loss=0.012469
>2/5 param=64.000000, loss=0.012642
>3/5 param=64.000000, loss=0.011566
>4/5 param=64.000000, loss=0.012310
>5/5 param=64.000000, loss=0.011647
>1/5 param=128.000000, loss=0.011784
>2/5 param=128.000000, loss=0.012318
>3/5 param=128.000000, loss=0.011870
>4/5 param=128.000000, loss=0.012462
>5/5 param=128.000000, loss=0.011698
>1/5 param=256.000000, loss=0.011950
>2/5 param=256.000000, loss=0.012359
>3/5 param=256.000000, loss=0.011391
>4/5 param=256.000000, loss=0.013040
>5/5 param=256.000000, loss=0.011791
              2         4         8        32        64       128       256
count  5.000000  5.000000  5.000000  5.000000  5.000000  5.000000  5.000000
mean   0.017486  0.012250  0.015674  0.011765  0.012127  0.012026  0.012106
std    0.001279  0.000714  0.000416  0.000989  0.000490  0.000341  0.000627
min    0.015809  0.011737  0.015007  0.011286  0.011566  0.011698  0.011391
25%    0.016560  0.011840  0.015618  0.011329  0.011647  0.011784  0.011791
50%    0.017831  0.011967  0.015742  0.011333  0.012310  0.011870  0.011950
75%    0.018300  0.012221  0.015883  0.011341  0.012469  0.012318  0.012359
max    0.018931  0.013485  0.016118  0.013534  0.012642  0.012462  0.013040
In [43]:
%load_ext autoreload
%autoreload 2
import tune_model
tune_model.tune_learning_rate(train_X,train_y,validation_X,validation_y)
>1/5 param=0.100000, loss=0.018710
>2/5 param=0.100000, loss=0.018996
>3/5 param=0.100000, loss=0.012239
>4/5 param=0.100000, loss=0.016884
>5/5 param=0.100000, loss=0.034545
>1/5 param=0.001000, loss=0.011974
>2/5 param=0.001000, loss=0.012450
>3/5 param=0.001000, loss=0.011630
>4/5 param=0.001000, loss=0.012381
>5/5 param=0.001000, loss=0.011430
>1/5 param=0.000100, loss=0.061702
>2/5 param=0.000100, loss=0.049063
>3/5 param=0.000100, loss=0.030822
>4/5 param=0.000100, loss=0.037265
>5/5 param=0.000100, loss=0.049900
            0.1     0.001    0.0001
count  5.000000  5.000000  5.000000
mean   0.020275  0.011973  0.045750
std    0.008423  0.000449  0.012016
min    0.012239  0.011430  0.030822
25%    0.016884  0.011630  0.037265
50%    0.018710  0.011974  0.049063
75%    0.018996  0.012381  0.049900
max    0.034545  0.012450  0.061702
In [44]:
%load_ext autoreload
%autoreload 2
import tune_model
tune_model.tune_weight_regularization(train_X,train_y,validation_X,validation_y)
>1/5 param=1.000000, loss=0.017821
>2/5 param=1.000000, loss=0.018784
>3/5 param=1.000000, loss=0.019011
>4/5 param=1.000000, loss=0.018409
>5/5 param=1.000000, loss=0.018491
>1/5 param=2.000000, loss=0.036415
>2/5 param=2.000000, loss=0.037353
>3/5 param=2.000000, loss=0.035984
>4/5 param=2.000000, loss=0.036259
>5/5 param=2.000000, loss=0.035325
>1/5 param=3.000000, loss=0.012934
>2/5 param=3.000000, loss=0.011526
>3/5 param=3.000000, loss=0.011695
>4/5 param=3.000000, loss=0.011955
>5/5 param=3.000000, loss=0.011743
>1/5 param=4.000000, loss=0.037054
>2/5 param=4.000000, loss=0.038430
>3/5 param=4.000000, loss=0.040168
>4/5 param=4.000000, loss=0.038008
>5/5 param=4.000000, loss=0.037282
              1         2         3         4
count  5.000000  5.000000  5.000000  5.000000
mean   0.018503  0.036267  0.011971  0.038188
std    0.000450  0.000736  0.000560  0.001237
min    0.017821  0.035325  0.011526  0.037054
25%    0.018409  0.035984  0.011695  0.037282
50%    0.018491  0.036259  0.011743  0.038008
75%    0.018784  0.036415  0.011955  0.038430
max    0.019011  0.037353  0.012934  0.040168
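
Across these runs a single memory cell, a batch size of 32, a learning rate of 0.001 and the third weight-regularization option produced the lowest mean validation losses; those settings presumably feed into the improved model tested next.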

10. Test Improved LSTM Model
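
improved_lstm_model is defined in models.py; as a sketch of what it may look like given the tuning results above (the concrete regularizer, cell count and sizes here are assumptions):

from keras.models import Sequential
from keras.layers import Dense, LSTM
from keras.regularizers import l2
from keras.optimizers import Adam

def improved_lstm_model(train_X, train_y, validation_X, validation_y):
    # small tuned LSTM with recurrent weight regularization (assumed settings)
    model = Sequential()
    model.add(LSTM(1, input_shape=(train_X.shape[1], train_X.shape[2]),
                   recurrent_regularizer=l2(0.01)))
    model.add(Dense(1))
    model.compile(loss='mae', optimizer=Adam(lr=0.001))
    history = model.fit(train_X, train_y, epochs=500, batch_size=32,
                        validation_data=(validation_X, validation_y),
                        verbose=2, shuffle=False)
    return model, history.history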

In [59]:
%load_ext autoreload
%autoreload 2
import models
model,history=models.improved_lstm_model(train_X,train_y,validation_X,validation_y)
Train on 550 samples, validate on 53 samples
Epoch 1/500
 - 14s - loss: 1.0657 - val_loss: 0.7597
Epoch 2/500
 - 0s - loss: 0.9254 - val_loss: 0.8572
Epoch 3/500
 - 0s - loss: 0.8650 - val_loss: 0.8579
 ... (epochs 4-166 omitted; training loss falls from 0.83 to roughly 0.028 and validation loss from 0.82 to roughly 0.014) ...
Epoch 167/500
 - 0s - loss: 0.0276 - val_loss: 0.0140
Epoch 168/500
 - 0s - loss: 0.0321 - val_loss: 0.0137
Epoch 169/500
 - 0s - loss: 0.0284 - val_loss: 0.0119
Epoch 170/500
 - 0s - loss: 0.0288 - val_loss: 0.0127
Epoch 171/500
 - 0s - loss: 0.0284 - val_loss: 0.0147
Epoch 172/500
 - 0s - loss: 0.0282 - val_loss: 0.0110
Epoch 173/500
 - 0s - loss: 0.0311 - val_loss: 0.0126
Epoch 174/500
 - 0s - loss: 0.0277 - val_loss: 0.0122
Epoch 175/500
 - 0s - loss: 0.0301 - val_loss: 0.0147
Epoch 176/500
 - 0s - loss: 0.0275 - val_loss: 0.0143
Epoch 177/500
 - 0s - loss: 0.0305 - val_loss: 0.0115
Epoch 178/500
 - 0s - loss: 0.0305 - val_loss: 0.0130
Epoch 179/500
 - 0s - loss: 0.0285 - val_loss: 0.0125
Epoch 180/500
 - 0s - loss: 0.0295 - val_loss: 0.0152
Epoch 181/500
 - 0s - loss: 0.0276 - val_loss: 0.0122
Epoch 182/500
 - 0s - loss: 0.0320 - val_loss: 0.0139
Epoch 183/500
 - 0s - loss: 0.0281 - val_loss: 0.0121
Epoch 184/500
 - 0s - loss: 0.0295 - val_loss: 0.0138
Epoch 185/500
 - 0s - loss: 0.0278 - val_loss: 0.0148
Epoch 186/500
 - 0s - loss: 0.0301 - val_loss: 0.0110
Epoch 187/500
 - 0s - loss: 0.0313 - val_loss: 0.0129
Epoch 188/500
 - 0s - loss: 0.0280 - val_loss: 0.0123
Epoch 189/500
 - 0s - loss: 0.0295 - val_loss: 0.0144
Epoch 190/500
 - 0s - loss: 0.0276 - val_loss: 0.0132
Epoch 191/500
 - 0s - loss: 0.0324 - val_loss: 0.0150
Epoch 192/500
 - 0s - loss: 0.0280 - val_loss: 0.0119
Epoch 193/500
 - 0s - loss: 0.0290 - val_loss: 0.0136
Epoch 194/500
 - 0s - loss: 0.0275 - val_loss: 0.0143
Epoch 195/500
 - 0s - loss: 0.0295 - val_loss: 0.0109
Epoch 196/500
 - 0s - loss: 0.0303 - val_loss: 0.0125
Epoch 197/500
 - 0s - loss: 0.0277 - val_loss: 0.0123
Epoch 198/500
 - 0s - loss: 0.0293 - val_loss: 0.0144
Epoch 199/500
 - 0s - loss: 0.0276 - val_loss: 0.0128
Epoch 200/500
 - 0s - loss: 0.0321 - val_loss: 0.0143
Epoch 201/500
 - 0s - loss: 0.0282 - val_loss: 0.0120
Epoch 202/500
 - 0s - loss: 0.0295 - val_loss: 0.0140
Epoch 203/500
 - 0s - loss: 0.0277 - val_loss: 0.0142
Epoch 204/500
 - 0s - loss: 0.0294 - val_loss: 0.0108
Epoch 205/500
 - 0s - loss: 0.0309 - val_loss: 0.0129
Epoch 206/500
 - 0s - loss: 0.0279 - val_loss: 0.0121
Epoch 207/500
 - 0s - loss: 0.0295 - val_loss: 0.0148
Epoch 208/500
 - 0s - loss: 0.0276 - val_loss: 0.0126
Epoch 209/500
 - 0s - loss: 0.0326 - val_loss: 0.0151
Epoch 210/500
 - 0s - loss: 0.0285 - val_loss: 0.0123
Epoch 211/500
 - 0s - loss: 0.0303 - val_loss: 0.0154
Epoch 212/500
 - 0s - loss: 0.0282 - val_loss: 0.0152
Epoch 213/500
 - 0s - loss: 0.0297 - val_loss: 0.0109
Epoch 214/500
 - 0s - loss: 0.0312 - val_loss: 0.0129
Epoch 215/500
 - 0s - loss: 0.0279 - val_loss: 0.0122
Epoch 216/500
 - 0s - loss: 0.0294 - val_loss: 0.0153
Epoch 217/500
 - 0s - loss: 0.0274 - val_loss: 0.0120
Epoch 218/500
 - 0s - loss: 0.0315 - val_loss: 0.0137
Epoch 219/500
 - 0s - loss: 0.0279 - val_loss: 0.0120
Epoch 220/500
 - 0s - loss: 0.0295 - val_loss: 0.0143
Epoch 221/500
 - 0s - loss: 0.0277 - val_loss: 0.0145
Epoch 222/500
 - 0s - loss: 0.0301 - val_loss: 0.0115
Epoch 223/500
 - 0s - loss: 0.0298 - val_loss: 0.0122
Epoch 224/500
 - 0s - loss: 0.0276 - val_loss: 0.0125
Epoch 225/500
 - 0s - loss: 0.0288 - val_loss: 0.0160
Epoch 226/500
 - 0s - loss: 0.0273 - val_loss: 0.0117
Epoch 227/500
 - 0s - loss: 0.0298 - val_loss: 0.0119
Epoch 228/500
 - 0s - loss: 0.0274 - val_loss: 0.0121
Epoch 229/500
 - 0s - loss: 0.0291 - val_loss: 0.0138
Epoch 230/500
 - 0s - loss: 0.0273 - val_loss: 0.0141
Epoch 231/500
 - 0s - loss: 0.0298 - val_loss: 0.0111
Epoch 232/500
 - 0s - loss: 0.0298 - val_loss: 0.0124
Epoch 233/500
 - 0s - loss: 0.0280 - val_loss: 0.0124
Epoch 234/500
 - 0s - loss: 0.0290 - val_loss: 0.0158
Epoch 235/500
 - 0s - loss: 0.0276 - val_loss: 0.0113
Epoch 236/500
 - 0s - loss: 0.0299 - val_loss: 0.0118
Epoch 237/500
 - 0s - loss: 0.0275 - val_loss: 0.0121
Epoch 238/500
 - 0s - loss: 0.0291 - val_loss: 0.0137
Epoch 239/500
 - 0s - loss: 0.0272 - val_loss: 0.0138
Epoch 240/500
 - 0s - loss: 0.0300 - val_loss: 0.0112
Epoch 241/500
 - 0s - loss: 0.0296 - val_loss: 0.0126
Epoch 242/500
 - 0s - loss: 0.0283 - val_loss: 0.0126
Epoch 243/500
 - 0s - loss: 0.0292 - val_loss: 0.0154
Epoch 244/500
 - 0s - loss: 0.0278 - val_loss: 0.0113
Epoch 245/500
 - 0s - loss: 0.0308 - val_loss: 0.0123
Epoch 246/500
 - 0s - loss: 0.0276 - val_loss: 0.0120
Epoch 247/500
 - 0s - loss: 0.0297 - val_loss: 0.0144
Epoch 248/500
 - 0s - loss: 0.0270 - val_loss: 0.0134
Epoch 249/500
 - 0s - loss: 0.0302 - val_loss: 0.0115
Epoch 250/500
 - 0s - loss: 0.0293 - val_loss: 0.0122
Epoch 251/500
 - 0s - loss: 0.0281 - val_loss: 0.0126
Epoch 252/500
 - 0s - loss: 0.0293 - val_loss: 0.0157
Epoch 253/500
 - 0s - loss: 0.0279 - val_loss: 0.0113
Epoch 254/500
 - 0s - loss: 0.0308 - val_loss: 0.0124
Epoch 255/500
 - 0s - loss: 0.0275 - val_loss: 0.0121
Epoch 256/500
 - 0s - loss: 0.0298 - val_loss: 0.0146
Epoch 257/500
 - 0s - loss: 0.0274 - val_loss: 0.0140
Epoch 258/500
 - 0s - loss: 0.0307 - val_loss: 0.0121
Epoch 259/500
 - 0s - loss: 0.0289 - val_loss: 0.0120
Epoch 260/500
 - 0s - loss: 0.0280 - val_loss: 0.0126
Epoch 261/500
 - 0s - loss: 0.0292 - val_loss: 0.0159
Epoch 262/500
 - 0s - loss: 0.0278 - val_loss: 0.0111
Epoch 263/500
 - 0s - loss: 0.0308 - val_loss: 0.0127
Epoch 264/500
 - 0s - loss: 0.0274 - val_loss: 0.0119
Epoch 265/500
 - 0s - loss: 0.0293 - val_loss: 0.0142
Epoch 266/500
 - 0s - loss: 0.0270 - val_loss: 0.0135
Epoch 267/500
 - 0s - loss: 0.0301 - val_loss: 0.0117
Epoch 268/500
 - 0s - loss: 0.0290 - val_loss: 0.0121
Epoch 269/500
 - 0s - loss: 0.0279 - val_loss: 0.0126
Epoch 270/500
 - 0s - loss: 0.0290 - val_loss: 0.0157
Epoch 271/500
 - 0s - loss: 0.0278 - val_loss: 0.0112
Epoch 272/500
 - 0s - loss: 0.0307 - val_loss: 0.0126
Epoch 273/500
 - 0s - loss: 0.0272 - val_loss: 0.0118
Epoch 274/500
 - 0s - loss: 0.0290 - val_loss: 0.0138
Epoch 275/500
 - 0s - loss: 0.0270 - val_loss: 0.0134
Epoch 276/500
 - 0s - loss: 0.0303 - val_loss: 0.0117
Epoch 277/500
 - 0s - loss: 0.0283 - val_loss: 0.0119
Epoch 278/500
 - 0s - loss: 0.0278 - val_loss: 0.0125
Epoch 279/500
 - 0s - loss: 0.0277 - val_loss: 0.0142
Epoch 280/500
 - 0s - loss: 0.0275 - val_loss: 0.0111
Epoch 281/500
 - 0s - loss: 0.0301 - val_loss: 0.0122
Epoch 282/500
 - 0s - loss: 0.0269 - val_loss: 0.0118
Epoch 283/500
 - 0s - loss: 0.0288 - val_loss: 0.0143
Epoch 284/500
 - 0s - loss: 0.0269 - val_loss: 0.0132
Epoch 285/500
 - 0s - loss: 0.0304 - val_loss: 0.0123
Epoch 286/500
 - 0s - loss: 0.0280 - val_loss: 0.0117
Epoch 287/500
 - 0s - loss: 0.0282 - val_loss: 0.0130
Epoch 288/500
 - 0s - loss: 0.0271 - val_loss: 0.0138
Epoch 289/500
 - 0s - loss: 0.0282 - val_loss: 0.0108
Epoch 290/500
 - 0s - loss: 0.0297 - val_loss: 0.0121
Epoch 291/500
 - 0s - loss: 0.0268 - val_loss: 0.0119
Epoch 292/500
 - 0s - loss: 0.0286 - val_loss: 0.0140
Epoch 293/500
 - 0s - loss: 0.0269 - val_loss: 0.0127
Epoch 294/500
 - 0s - loss: 0.0306 - val_loss: 0.0126
Epoch 295/500
 - 0s - loss: 0.0279 - val_loss: 0.0117
Epoch 296/500
 - 0s - loss: 0.0284 - val_loss: 0.0131
Epoch 297/500
 - 0s - loss: 0.0272 - val_loss: 0.0135
Epoch 298/500
 - 0s - loss: 0.0283 - val_loss: 0.0110
Epoch 299/500
 - 0s - loss: 0.0304 - val_loss: 0.0128
Epoch 300/500
 - 0s - loss: 0.0275 - val_loss: 0.0123
Epoch 301/500
 - 0s - loss: 0.0294 - val_loss: 0.0145
Epoch 302/500
 - 0s - loss: 0.0272 - val_loss: 0.0124
Epoch 303/500
 - 0s - loss: 0.0320 - val_loss: 0.0149
Epoch 304/500
 - 0s - loss: 0.0279 - val_loss: 0.0121
Epoch 305/500
 - 0s - loss: 0.0295 - val_loss: 0.0144
Epoch 306/500
 - 0s - loss: 0.0272 - val_loss: 0.0135
Epoch 307/500
 - 0s - loss: 0.0304 - val_loss: 0.0117
Epoch 308/500
 - 0s - loss: 0.0293 - val_loss: 0.0124
Epoch 309/500
 - 0s - loss: 0.0282 - val_loss: 0.0127
Epoch 310/500
 - 0s - loss: 0.0288 - val_loss: 0.0155
Epoch 311/500
 - 0s - loss: 0.0277 - val_loss: 0.0110
Epoch 312/500
 - 0s - loss: 0.0302 - val_loss: 0.0122
Epoch 313/500
 - 0s - loss: 0.0272 - val_loss: 0.0117
Epoch 314/500
 - 0s - loss: 0.0287 - val_loss: 0.0137
Epoch 315/500
 - 0s - loss: 0.0269 - val_loss: 0.0132
Epoch 316/500
 - 0s - loss: 0.0301 - val_loss: 0.0117
Epoch 317/500
 - 0s - loss: 0.0286 - val_loss: 0.0120
Epoch 318/500
 - 0s - loss: 0.0278 - val_loss: 0.0125
Epoch 319/500
 - 0s - loss: 0.0281 - val_loss: 0.0152
Epoch 320/500
 - 0s - loss: 0.0279 - val_loss: 0.0111
Epoch 321/500
 - 0s - loss: 0.0302 - val_loss: 0.0123
Epoch 322/500
 - 0s - loss: 0.0270 - val_loss: 0.0118
Epoch 323/500
 - 0s - loss: 0.0290 - val_loss: 0.0137
Epoch 324/500
 - 0s - loss: 0.0269 - val_loss: 0.0135
Epoch 325/500
 - 0s - loss: 0.0306 - val_loss: 0.0125
Epoch 326/500
 - 0s - loss: 0.0280 - val_loss: 0.0116
Epoch 327/500
 - 0s - loss: 0.0278 - val_loss: 0.0126
Epoch 328/500
 - 0s - loss: 0.0272 - val_loss: 0.0139
Epoch 329/500
 - 0s - loss: 0.0280 - val_loss: 0.0108
Epoch 330/500
 - 0s - loss: 0.0303 - val_loss: 0.0126
Epoch 331/500
 - 0s - loss: 0.0273 - val_loss: 0.0122
Epoch 332/500
 - 0s - loss: 0.0288 - val_loss: 0.0142
Epoch 333/500
 - 0s - loss: 0.0268 - val_loss: 0.0126
Epoch 334/500
 - 0s - loss: 0.0309 - val_loss: 0.0132
Epoch 335/500
 - 0s - loss: 0.0278 - val_loss: 0.0117
Epoch 336/500
 - 0s - loss: 0.0285 - val_loss: 0.0134
Epoch 337/500
 - 0s - loss: 0.0270 - val_loss: 0.0134
Epoch 338/500
 - 0s - loss: 0.0289 - val_loss: 0.0109
Epoch 339/500
 - 0s - loss: 0.0294 - val_loss: 0.0116
Epoch 340/500
 - 0s - loss: 0.0272 - val_loss: 0.0123
Epoch 341/500
 - 0s - loss: 0.0289 - val_loss: 0.0154
Epoch 342/500
 - 0s - loss: 0.0271 - val_loss: 0.0114
Epoch 343/500
 - 0s - loss: 0.0296 - val_loss: 0.0115
Epoch 344/500
 - 0s - loss: 0.0272 - val_loss: 0.0119
Epoch 345/500
 - 0s - loss: 0.0287 - val_loss: 0.0137
Epoch 346/500
 - 0s - loss: 0.0271 - val_loss: 0.0133
Epoch 347/500
 - 0s - loss: 0.0291 - val_loss: 0.0110
Epoch 348/500
 - 0s - loss: 0.0291 - val_loss: 0.0119
Epoch 349/500
 - 0s - loss: 0.0273 - val_loss: 0.0124
Epoch 350/500
 - 0s - loss: 0.0285 - val_loss: 0.0158
Epoch 351/500
 - 0s - loss: 0.0274 - val_loss: 0.0113
Epoch 352/500
 - 0s - loss: 0.0297 - val_loss: 0.0119
Epoch 353/500
 - 0s - loss: 0.0270 - val_loss: 0.0117
Epoch 354/500
 - 0s - loss: 0.0288 - val_loss: 0.0137
Epoch 355/500
 - 0s - loss: 0.0267 - val_loss: 0.0132
Epoch 356/500
 - 0s - loss: 0.0305 - val_loss: 0.0120
Epoch 357/500
 - 0s - loss: 0.0283 - val_loss: 0.0119
Epoch 358/500
 - 0s - loss: 0.0278 - val_loss: 0.0126
Epoch 359/500
 - 0s - loss: 0.0275 - val_loss: 0.0142
Epoch 360/500
 - 0s - loss: 0.0276 - val_loss: 0.0110
Epoch 361/500
 - 0s - loss: 0.0298 - val_loss: 0.0121
Epoch 362/500
 - 0s - loss: 0.0268 - val_loss: 0.0118
Epoch 363/500
 - 0s - loss: 0.0287 - val_loss: 0.0139
Epoch 364/500
 - 0s - loss: 0.0270 - val_loss: 0.0128
Epoch 365/500
 - 0s - loss: 0.0308 - val_loss: 0.0130
Epoch 366/500
 - 0s - loss: 0.0278 - val_loss: 0.0117
Epoch 367/500
 - 0s - loss: 0.0285 - val_loss: 0.0136
Epoch 368/500
 - 0s - loss: 0.0269 - val_loss: 0.0133
Epoch 369/500
 - 0s - loss: 0.0288 - val_loss: 0.0109
Epoch 370/500
 - 0s - loss: 0.0290 - val_loss: 0.0115
Epoch 371/500
 - 0s - loss: 0.0270 - val_loss: 0.0122
Epoch 372/500
 - 0s - loss: 0.0286 - val_loss: 0.0151
Epoch 373/500
 - 0s - loss: 0.0269 - val_loss: 0.0113
Epoch 374/500
 - 0s - loss: 0.0300 - val_loss: 0.0122
Epoch 375/500
 - 0s - loss: 0.0272 - val_loss: 0.0118
Epoch 376/500
 - 0s - loss: 0.0292 - val_loss: 0.0141
Epoch 377/500
 - 0s - loss: 0.0270 - val_loss: 0.0132
Epoch 378/500
 - 0s - loss: 0.0305 - val_loss: 0.0121
Epoch 379/500
 - 0s - loss: 0.0289 - val_loss: 0.0119
Epoch 380/500
 - 0s - loss: 0.0277 - val_loss: 0.0124
Epoch 381/500
 - 0s - loss: 0.0282 - val_loss: 0.0149
Epoch 382/500
 - 0s - loss: 0.0279 - val_loss: 0.0110
Epoch 383/500
 - 0s - loss: 0.0303 - val_loss: 0.0123
Epoch 384/500
 - 0s - loss: 0.0270 - val_loss: 0.0117
Epoch 385/500
 - 0s - loss: 0.0290 - val_loss: 0.0137
Epoch 386/500
 - 0s - loss: 0.0269 - val_loss: 0.0127
Epoch 387/500
 - 0s - loss: 0.0309 - val_loss: 0.0129
Epoch 388/500
 - 0s - loss: 0.0279 - val_loss: 0.0115
Epoch 389/500
 - 0s - loss: 0.0280 - val_loss: 0.0131
Epoch 390/500
 - 0s - loss: 0.0269 - val_loss: 0.0134
Epoch 391/500
 - 0s - loss: 0.0286 - val_loss: 0.0108
Epoch 392/500
 - 0s - loss: 0.0296 - val_loss: 0.0120
Epoch 393/500
 - 0s - loss: 0.0272 - val_loss: 0.0122
Epoch 394/500
 - 0s - loss: 0.0289 - val_loss: 0.0155
Epoch 395/500
 - 0s - loss: 0.0269 - val_loss: 0.0112
Epoch 396/500
 - 0s - loss: 0.0297 - val_loss: 0.0120
Epoch 397/500
 - 0s - loss: 0.0270 - val_loss: 0.0117
Epoch 398/500
 - 0s - loss: 0.0287 - val_loss: 0.0139
Epoch 399/500
 - 0s - loss: 0.0269 - val_loss: 0.0132
Epoch 400/500
 - 0s - loss: 0.0300 - val_loss: 0.0120
Epoch 401/500
 - 0s - loss: 0.0285 - val_loss: 0.0118
Epoch 402/500
 - 0s - loss: 0.0272 - val_loss: 0.0125
Epoch 403/500
 - 0s - loss: 0.0277 - val_loss: 0.0144
Epoch 404/500
 - 0s - loss: 0.0274 - val_loss: 0.0112
Epoch 405/500
 - 0s - loss: 0.0300 - val_loss: 0.0121
Epoch 406/500
 - 0s - loss: 0.0268 - val_loss: 0.0118
Epoch 407/500
 - 0s - loss: 0.0288 - val_loss: 0.0142
Epoch 408/500
 - 0s - loss: 0.0269 - val_loss: 0.0128
Epoch 409/500
 - 0s - loss: 0.0308 - val_loss: 0.0135
Epoch 410/500
 - 0s - loss: 0.0276 - val_loss: 0.0118
Epoch 411/500
 - 0s - loss: 0.0287 - val_loss: 0.0139
Epoch 412/500
 - 0s - loss: 0.0271 - val_loss: 0.0132
Epoch 413/500
 - 0s - loss: 0.0293 - val_loss: 0.0114
Epoch 414/500
 - 0s - loss: 0.0287 - val_loss: 0.0117
Epoch 415/500
 - 0s - loss: 0.0272 - val_loss: 0.0123
Epoch 416/500
 - 0s - loss: 0.0286 - val_loss: 0.0156
Epoch 417/500
 - 0s - loss: 0.0271 - val_loss: 0.0113
Epoch 418/500
 - 0s - loss: 0.0298 - val_loss: 0.0119
Epoch 419/500
 - 0s - loss: 0.0271 - val_loss: 0.0118
Epoch 420/500
 - 0s - loss: 0.0293 - val_loss: 0.0141
Epoch 421/500
 - 0s - loss: 0.0270 - val_loss: 0.0135
Epoch 422/500
 - 0s - loss: 0.0302 - val_loss: 0.0119
Epoch 423/500
 - 0s - loss: 0.0291 - val_loss: 0.0120
Epoch 424/500
 - 0s - loss: 0.0276 - val_loss: 0.0124
Epoch 425/500
 - 0s - loss: 0.0287 - val_loss: 0.0167
Epoch 426/500
 - 0s - loss: 0.0283 - val_loss: 0.0111
Epoch 427/500
 - 0s - loss: 0.0306 - val_loss: 0.0125
Epoch 428/500
 - 0s - loss: 0.0270 - val_loss: 0.0118
Epoch 429/500
 - 0s - loss: 0.0291 - val_loss: 0.0140
Epoch 430/500
 - 0s - loss: 0.0269 - val_loss: 0.0126
Epoch 431/500
 - 0s - loss: 0.0306 - val_loss: 0.0133
Epoch 432/500
 - 0s - loss: 0.0276 - val_loss: 0.0118
Epoch 433/500
 - 0s - loss: 0.0282 - val_loss: 0.0136
Epoch 434/500
 - 0s - loss: 0.0270 - val_loss: 0.0131
Epoch 435/500
 - 0s - loss: 0.0286 - val_loss: 0.0110
Epoch 436/500
 - 0s - loss: 0.0286 - val_loss: 0.0115
Epoch 437/500
 - 0s - loss: 0.0271 - val_loss: 0.0122
Epoch 438/500
 - 0s - loss: 0.0284 - val_loss: 0.0157
Epoch 439/500
 - 0s - loss: 0.0272 - val_loss: 0.0113
Epoch 440/500
 - 0s - loss: 0.0296 - val_loss: 0.0118
Epoch 441/500
 - 0s - loss: 0.0267 - val_loss: 0.0117
Epoch 442/500
 - 0s - loss: 0.0285 - val_loss: 0.0141
Epoch 443/500
 - 0s - loss: 0.0267 - val_loss: 0.0125
Epoch 444/500
 - 0s - loss: 0.0299 - val_loss: 0.0122
Epoch 445/500
 - 0s - loss: 0.0278 - val_loss: 0.0117
Epoch 446/500
 - 0s - loss: 0.0280 - val_loss: 0.0128
Epoch 447/500
 - 0s - loss: 0.0269 - val_loss: 0.0134
Epoch 448/500
 - 0s - loss: 0.0282 - val_loss: 0.0110
Epoch 449/500
 - 0s - loss: 0.0293 - val_loss: 0.0118
Epoch 450/500
 - 0s - loss: 0.0269 - val_loss: 0.0121
Epoch 451/500
 - 0s - loss: 0.0286 - val_loss: 0.0154
Epoch 452/500
 - 0s - loss: 0.0269 - val_loss: 0.0112
Epoch 453/500
 - 0s - loss: 0.0292 - val_loss: 0.0117
Epoch 454/500
 - 0s - loss: 0.0268 - val_loss: 0.0116
Epoch 455/500
 - 0s - loss: 0.0278 - val_loss: 0.0134
Epoch 456/500
 - 0s - loss: 0.0265 - val_loss: 0.0129
Epoch 457/500
 - 0s - loss: 0.0296 - val_loss: 0.0120
Epoch 458/500
 - 0s - loss: 0.0276 - val_loss: 0.0115
Epoch 459/500
 - 0s - loss: 0.0277 - val_loss: 0.0128
Epoch 460/500
 - 0s - loss: 0.0268 - val_loss: 0.0133
Epoch 461/500
 - 0s - loss: 0.0284 - val_loss: 0.0109
Epoch 462/500
 - 0s - loss: 0.0289 - val_loss: 0.0115
Epoch 463/500
 - 0s - loss: 0.0268 - val_loss: 0.0121
Epoch 464/500
 - 0s - loss: 0.0286 - val_loss: 0.0147
Epoch 465/500
 - 0s - loss: 0.0268 - val_loss: 0.0115
Epoch 466/500
 - 0s - loss: 0.0297 - val_loss: 0.0121
Epoch 467/500
 - 0s - loss: 0.0271 - val_loss: 0.0117
Epoch 468/500
 - 0s - loss: 0.0290 - val_loss: 0.0139
Epoch 469/500
 - 0s - loss: 0.0268 - val_loss: 0.0134
Epoch 470/500
 - 0s - loss: 0.0299 - val_loss: 0.0121
Epoch 471/500
 - 0s - loss: 0.0285 - val_loss: 0.0120
Epoch 472/500
 - 0s - loss: 0.0273 - val_loss: 0.0122
Epoch 473/500
 - 0s - loss: 0.0278 - val_loss: 0.0142
Epoch 474/500
 - 0s - loss: 0.0277 - val_loss: 0.0111
Epoch 475/500
 - 0s - loss: 0.0301 - val_loss: 0.0127
Epoch 476/500
 - 0s - loss: 0.0272 - val_loss: 0.0119
Epoch 477/500
 - 0s - loss: 0.0289 - val_loss: 0.0149
Epoch 478/500
 - 0s - loss: 0.0271 - val_loss: 0.0117
Epoch 479/500
 - 0s - loss: 0.0304 - val_loss: 0.0128
Epoch 480/500
 - 0s - loss: 0.0273 - val_loss: 0.0116
Epoch 481/500
 - 0s - loss: 0.0286 - val_loss: 0.0137
Epoch 482/500
 - 0s - loss: 0.0268 - val_loss: 0.0130
Epoch 483/500
 - 0s - loss: 0.0297 - val_loss: 0.0120
Epoch 484/500
 - 0s - loss: 0.0285 - val_loss: 0.0121
Epoch 485/500
 - 0s - loss: 0.0274 - val_loss: 0.0126
Epoch 486/500
 - 0s - loss: 0.0269 - val_loss: 0.0133
Epoch 487/500
 - 0s - loss: 0.0275 - val_loss: 0.0110
Epoch 488/500
 - 0s - loss: 0.0288 - val_loss: 0.0114
Epoch 489/500
 - 0s - loss: 0.0264 - val_loss: 0.0117
Epoch 490/500
 - 0s - loss: 0.0277 - val_loss: 0.0133
Epoch 491/500
 - 0s - loss: 0.0267 - val_loss: 0.0118
Epoch 492/500
 - 0s - loss: 0.0299 - val_loss: 0.0127
Epoch 493/500
 - 0s - loss: 0.0272 - val_loss: 0.0117
Epoch 494/500
 - 0s - loss: 0.0282 - val_loss: 0.0132
Epoch 495/500
 - 0s - loss: 0.0269 - val_loss: 0.0128
Epoch 496/500
 - 0s - loss: 0.0292 - val_loss: 0.0114
Epoch 497/500
 - 0s - loss: 0.0289 - val_loss: 0.0117
Epoch 498/500
 - 0s - loss: 0.0271 - val_loss: 0.0122
Epoch 499/500
 - 0s - loss: 0.0283 - val_loss: 0.0154
Epoch 500/500
 - 0s - loss: 0.0275 - val_loss: 0.0113
In [60]:
# Plot training versus validation loss per epoch; `history` holds the
# dict returned by model.fit(...).history in the training cell above.
pyplot.plot(history['loss'], label='train')
pyplot.plot(history['val_loss'], label='validation')
pyplot.legend()
pyplot.show()
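The curve and the log both show the validation loss bottoming out near epoch 131 and merely oscillating for the remaining epochs, so most of the 500-epoch budget adds nothing. Below is a minimal sketch of how the same fit could stop itself using keras.callbacks.EarlyStopping; `train_X`/`train_y`, the patience value, and the other fit arguments are assumptions standing in for the actual training cell, which is not shown here.

In [ ]:
from keras.callbacks import EarlyStopping

# Stop once val_loss has not improved for 30 epochs (illustrative patience)
# and roll back to the best weights seen (requires Keras >= 2.2.3).
early_stop = EarlyStopping(monitor='val_loss', patience=30,
                           restore_best_weights=True)

# Assumed to mirror the real training cell: same model, same 500-epoch cap,
# verbose=2 to match the log format above.
history = model.fit(train_X, train_y, epochs=500, verbose=2,
                    validation_data=(validation_X, validation_y),
                    callbacks=[early_stop]).history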

Test the Improved Model on Validation Data

In [65]:
# make a prediction
%load_ext autoreload
%autoreload 2
import models
inv_yhat, inv_y, rmse = models.make_lstm_prediction(validation_X, validation_y, model, scaler)
print('LSTM Model on Validation Data RMSE: %.3f' % rmse)
The autoreload extension is already loaded. To reload it, use:
  %reload_ext autoreload
LSTM Model on Validation Data RMSE: 8.604
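models.make_lstm_prediction is defined in the project's models.py, which this notebook only imports. As a rough guide to what such a helper does -- predict, re-attach the input features, undo the MinMaxScaler, and score with RMSE -- here is an illustrative sketch, assuming a single timestep per sample and that the target price occupies column 0 of the matrix the scaler was fit on:

In [ ]:
from math import sqrt
from numpy import concatenate
from sklearn.metrics import mean_squared_error

def make_lstm_prediction_sketch(data_X, data_y, model, scaler):
    # Illustrative only; the project's models.make_lstm_prediction may differ.
    yhat = model.predict(data_X)                                 # scaled predictions, shape (n, 1)
    flat_X = data_X.reshape((data_X.shape[0], data_X.shape[2]))  # (n, 1, f) -> (n, f)
    # Rebuild full-width rows so the scaler can invert them, then keep column 0.
    inv_yhat = scaler.inverse_transform(
        concatenate((yhat, flat_X[:, 1:]), axis=1))[:, 0]
    inv_y = scaler.inverse_transform(
        concatenate((data_y.reshape(-1, 1), flat_X[:, 1:]), axis=1))[:, 0]
    rmse = sqrt(mean_squared_error(inv_y, inv_yhat))
    return inv_yhat, inv_y, rmse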
In [66]:
%load_ext autoreload
%autoreload 2
import visuals

visuals.plot_series_to_compare(inv_y, inv_yhat, "Actual Price", "Predicted Price", "Improved Model: Actual Price Versus LSTM Predicted Price on Validation Data")
The autoreload extension is already loaded. To reload it, use:
  %reload_ext autoreload
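visuals.plot_series_to_compare is likewise defined in the project's visuals.py, not shown here. An equivalent comparison plot needs only a few lines of matplotlib; a sketch, assuming the two series are 1-D arrays of equal length:

In [ ]:
import matplotlib.pyplot as plt

def plot_series_to_compare_sketch(actual, predicted, label_actual,
                                  label_predicted, title):
    # Illustrative stand-in for visuals.plot_series_to_compare.
    plt.figure(figsize=(12, 5))
    plt.plot(actual, label=label_actual)
    plt.plot(predicted, label=label_predicted)
    plt.title(title)
    plt.xlabel('Week')
    plt.ylabel('Price')
    plt.legend()
    plt.show()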

Test the Improved Model on Unseen (Test) Data

In [67]:
# make a prediction
%load_ext autoreload
%autoreload 2
import models
inv_yhat, inv_y, rmse = models.make_lstm_prediction(test_X, test_y, model, scaler)
print('LSTM Model on Test Data RMSE: %.3f' % rmse)
The autoreload extension is already loaded. To reload it, use:
  %reload_ext autoreload
LSTM Model on Test Data RMSE: 8.663
In [68]:
%load_ext autoreload
%autoreload 2
import visuals

visuals.plot_series_to_compare(inv_y, inv_yhat, "Actual Price", "Predicted Price", "Improved Model: Actual Price Versus LSTM Predicted Price on Test Data")
The autoreload extension is already loaded. To reload it, use:
  %reload_ext autoreload
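Note that the test RMSE (8.663) is nearly identical to the validation RMSE (8.604), which suggests the improved model generalizes to unseen data about as well as it performed on the validation period.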